Flexible sparse regularization

Authors
Abstract


Similar articles

Non-convex Sparse Regularization

We study the regularising properties of Tikhonov regularisation on the sequence space ℓ² with weighted, non-quadratic penalty term acting separately on the coefficients of a given sequence. We derive sufficient conditions for the penalty term that guarantee the well-posedness of the method, and investigate to which extent the same conditions are also necessary. A particular interest of this pape...
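
As a rough sketch of that setting (the symbols below are assumed here, not taken from the paper), the weighted, separable Tikhonov functional can be written as

\min_{u \in \ell^2} \; \|Au - v\|^2 + \sum_k w_k \, \phi(|u_k|),

where A is the forward operator, v the data, w_k > 0 the weights, and \phi a possibly non-convex, non-quadratic penalty such as \phi(t) = t^q with 0 < q \le 1.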


Sparse Trace Norm Regularization

We study the problem of estimating multiple predictive functions from a dictionary of basis functions in the nonparametric regression setting. Our estimation scheme assumes that each predictive function can be estimated in the form of a linear combination of the basis functions. By assuming that the coefficient matrix admits a sparse low-rank structure, we formulate the function estimation prob...
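
In schematic form (notation assumed for illustration), with a coefficient matrix W expressing each predictive function as a linear combination of the dictionary of basis functions, such an estimator solves

\min_W \; \mathrm{loss}(W) + \lambda_1 \|W\|_1 + \lambda_2 \|W\|_*,

where the entrywise ℓ¹ norm \|W\|_1 encourages sparsity and the trace (nuclear) norm \|W\|_* encourages low rank, matching the assumed sparse low-rank structure.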


Sparse regularization for precipitation downscaling

Downscaling of remotely sensed precipitation images and outputs of general circulation models has been a subject of intense interest in hydrometeorology. The problem of downscaling is basically one of resolution enhancement, that is, appropriately adding details or high frequency features onto a low-resolution observation or simulated rainfall field. Invoking the property of rainfall self s...
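
Viewed as an inverse problem (notation here is illustrative, not from the paper), downscaling of this kind amounts to recovering a high-resolution field x from a low-resolution observation y via something like

\min_x \; \|y - DHx\|^2 + \lambda \|\Phi x\|_1,

where H models smoothing/sensing, D is a downsampling operator, and \Phi is a sparsifying transform (e.g. wavelets) in which the rainfall field is assumed to have a sparse representation.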


Well-posedness Classes for Sparse Regularization

Because of their sparsity enhancing properties, ℓ¹ penalty terms have recently received much attention in the field of inverse problems. Also, it has been shown that certain properties of the linear operator A to be inverted imply that ℓ¹-regularization is equivalent to ℓ⁰-regularization, which tries to minimise the number of non-zero coefficients. In the context of compressed sensing...
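
For reference, the two problems being compared can be written (in assumed notation) as

\min_u \|Au - v\|^2 + \alpha \|u\|_0 \qquad \text{and} \qquad \min_u \|Au - v\|^2 + \alpha \|u\|_1,

where \|u\|_0 counts the non-zero coefficients of u; the equivalence results alluded to above state that, under suitable conditions on A, minimisers of the ℓ¹ problem are also minimisers of the ℓ⁰ problem.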


Group sparse regularization for deep neural networks

In this paper, we consider the joint task of simultaneously optimizing (i) the weights of a deep neural network, (ii) the number of neurons for each hidden layer, and (iii) the subset of active input features (i.e., feature selection). While these problems are generally dealt with separately, we present a simple regularized formulation allowing to solve all three of them in parallel, using stan...
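
A minimal sketch of such a group-sparse penalty, assuming a plain NumPy setting (the function and variable names below are illustrative, not from the paper): grouping a layer's weights by output neuron (rows) or by input feature (columns) and penalising each group's Euclidean norm pushes whole groups, and hence whole neurons or features, to zero.

import numpy as np

def group_sparse_penalty(W, axis=1):
    # Group-lasso penalty: sum of Euclidean norms of weight groups.
    # W    : 2-D weight matrix of one layer (outputs x inputs).
    # axis : axis along which each group's norm is taken;
    #        axis=1 groups by output neuron (rows),
    #        axis=0 groups by input feature (columns).
    return np.sum(np.sqrt(np.sum(W ** 2, axis=axis)))

# Toy usage: add the penalty to the data-fit loss before computing gradients.
rng = np.random.default_rng(0)
W = rng.normal(size=(5, 3))   # one layer: 5 neurons, 3 input features
lam = 1e-2                    # regularisation strength (illustrative)
data_fit = 0.0                # placeholder for the usual training loss
total_loss = data_fit + lam * group_sparse_penalty(W, axis=1)
print(total_loss)

In practice the same term would be added to the network's training objective for every layer whose neurons or input features are candidates for removal.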



Journal

Journal title: Inverse Problems

Year: 2016

ISSN: 0266-5611, 1361-6420

DOI: 10.1088/0266-5611/33/1/014002